# 32k long context
| Model | License | Type | Publisher | Downloads | Likes | Description |
| --- | --- | --- | --- | --- | --- | --- |
| Nomic Embed Code GGUF | Apache-2.0 | Text Embedding | lmstudio-community | 622 | 1 | A code-retrieval embedding model based on Qwen2 that supports multiple programming languages for efficient code search and matching. |
| Qwen3 4B Mishima Imatrix GGUF | Apache-2.0 | Large Language Model | DavidAU | 105 | 2 | A Mishima Imatrix quantization of Qwen3-4B, tuned with a prose-oriented dataset to enhance prose-style generation. |
| Phi 4 Reasoning Plus GGUF | MIT | Large Language Model, Multilingual | lmstudio-community | 5,205 | 4 | Phi-4-reasoning-plus is a large language model developed by Microsoft with enhanced reasoning capabilities, optimized for complex mathematical problems and multi-step reasoning tasks. |
| Norwai Mistral 7B Instruct | | Large Language Model | NorwAI | 18.62k | 10 | A Mistral-7B fine-tune for Norwegian-language instruction following, developed by the Norwegian University of Science and Technology's NorwAI research center and partners. |
| It 5.3 Fp16 32k | Apache-2.0 | Large Language Model, Transformers, Multilingual | Vikhrmodels | 74 | 11 | Vikhr 0.5 is an open-source large language model supporting Russian and English, with its RoPE context window extended to 32k and improved JSON handling and multi-turn dialogue. |
| Dark Miqu 70B | Other | Large Language Model, Transformers | jukofyork | 46 | 30 | A 70B-parameter model specialized in dark-style creative writing with 32k context, based on miqu-1-70b and excelling at dark/grimdark fantasy themes. |
| Mistral 7B Instruct V0.2 | Apache-2.0 | Large Language Model, Transformers | mistralai | 1.1M | 2,737 | Mistral-7B-Instruct-v0.2 is an instruction-tuned model based on Mistral-7B-v0.2, supporting a 32k context window with the sliding-window attention mechanism removed. |
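
As a rough illustration of how a 32k-context GGUF model from this list could be run locally, here is a minimal sketch using llama-cpp-python. The model path and prompt are placeholders, and `n_ctx=32768` simply requests the full 32k window, assuming the chosen quantization and available memory can accommodate it.

```python
# Minimal sketch: running a 32k-context GGUF model (e.g. a Mistral 7B Instruct v0.2
# quantization) with llama-cpp-python. The file name below is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct-v0.2.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=32768,      # request the full 32k context window
    n_gpu_layers=-1,  # offload all layers to GPU if one is available
)

response = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "Summarize the following document: ..."},
    ],
    max_tokens=512,
)
print(response["choices"][0]["message"]["content"])
```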